Backgrounder

What Is the U.S. Federal Reserve?

Over the past decade, the Fed kept interest rates low while it deployed trillions of dollars in stimulus and expanded its regulatory oversight. Now, the central bank is back in the spotlight for its battle against inflation.
The Federal Reserve is the most powerful U.S. economic institution. (Win McNamee/Getty Images)
Summary
  • The Federal Reserve is the most powerful economic institution in the United States. It is responsible for managing monetary policy and regulating the financial system.
  • It does this by setting interest rates, influencing the supply of money in the economy, and, in recent years, making trillions of dollars in asset purchases to boost financial markets.
  • The Fed raised rates after the COVID-19 pandemic and the Russian invasion of Ukraine together spurred alarming inflation, but calls are mounting to lower them amid fears of a recession.

Introduction

The U.S. central banking system—the Federal Reserve, or the Fed—is the most powerful economic institution in the United States, and perhaps the world. Its core responsibilities include setting interest rates, managing the money supply, and regulating financial markets. It also acts as a lender of last resort during periods of economic crisis, as demonstrated during the 2008 financial meltdown and the COVID-19 pandemic. Since 2020, the central bank has struggled with how to slow rapid inflation without damaging economic growth.

Given the immensity of its powers, the Fed is no stranger to controversy. Some economists have argued that its aggressive policies risk inflation and asset bubbles, while others feel the Fed’s support for financial markets favors big business over workers. The central bank is also one of the most politically independent U.S. government bodies, which has long caused tension with lawmakers and presidents.

What does the Fed do?

For most of the nineteenth century, the United States had no central bank to serve as a lender of last resort, leaving the country vulnerable to a series of financial panics and bank runs. In response, Congress passed the 1913 Federal Reserve Act, signed into law by President Woodrow Wilson, which created a Federal Reserve System of twelve public-private regional banks. The New York Fed, responsible for the heart of the nation’s financial life, has long been considered first among equals. It runs the Fed’s trading desks, helps regulate Wall Street, and oversees the largest pool of assets.

Today, the Fed is tasked with managing U.S. monetary policy, regulating bank holding companies and other member banks, and monitoring systemic risk in the financial system. The seven-member Board of Governors, the system’s seat of power, is based in Washington, DC, and currently led by Fed Chair Jerome Powell. Each governor is appointed by the president to a fourteen-year term, subject to confirmation by the Senate. The Board of Governors forms part of a larger body, the Federal Open Market Committee (FOMC), which also includes five of the twelve regional bank presidents: the president of the New York Fed, who holds a permanent seat, and four others who serve on a rotating basis. The FOMC is responsible for setting interest rate targets and managing the money supply.

Historically, the Fed has been driven by a dual mandate: first, to maintain stable prices, and second, to achieve full employment. In pursuit of price stability, the Fed sets an annual inflation target of 2 percent. Economists debate the definition of full employment but generally take it to mean an unemployment rate of around 4 or 5 percent. (In August 2020, the bank announced that it would begin tolerating periods of higher inflation to make up for periods when inflation runs lower, though it has not yet embraced this practice.)

To fulfill its mandate, the Fed’s most important lever is the buying or selling of U.S. Treasury bonds in the open market to influence banking reserves and interest rates. For instance, the Fed’s purchase of bonds puts more money into the financial system and thus reduces the cost of borrowing. The Fed can also lend directly to commercial banks at an interest rate it sets, known as the discount rate, to increase the money supply.
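The balance-sheet mechanics of an open market purchase can be shown with a stylized sketch. The figures and account names below are invented for illustration and greatly simplify actual Fed accounting; the point is that the Fed pays for bonds by crediting the selling bank’s reserve account, creating new reserves rather than moving existing money around.

```python
# Stylized sketch of an open market purchase (hypothetical figures, $ billions).
# The Fed pays for bonds by crediting the seller's reserve account, so new
# reserves are created on both sides of the Fed's balance sheet.

fed = {"treasuries_held": 900, "bank_reserves_owed": 100}   # Fed's books
bank = {"reserves_at_fed": 100, "treasuries_held": 50}      # a member bank

def open_market_purchase(amount: float) -> None:
    """Fed buys `amount` of Treasuries from the bank in the open market."""
    bank["treasuries_held"] -= amount
    bank["reserves_at_fed"] += amount      # newly created reserves
    fed["treasuries_held"] += amount
    fed["bank_reserves_owed"] += amount    # Fed's liabilities grow to match

open_market_purchase(25)
print(bank)  # {'reserves_at_fed': 125, 'treasuries_held': 25}
# More abundant reserves make overnight borrowing between banks cheaper,
# pulling short-term interest rates down; a sale works in reverse.
```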

What does the Fed chair do?

Few officials in Washington enjoy the power and autonomy of the chair of the Federal Reserve. The chair acts as a spokesperson for the central bank, negotiates with the executive branch and Congress, and controls the agenda of board and FOMC meetings. Analysts and investors hang on the chair’s every word, and markets instantly react to the faintest clues on interest rate policy.

The chair is appointed by the president, and the Fed, which controls its own budget, is mostly independent from the whims of Congress. Once confirmed, the Fed chair is also largely free of control by the White House; there is no accepted mechanism for a president to remove them, and it is legally uncertain whether one could do so at all.

Recent Fed chairs include:

Paul Volcker, 1979–1987. Appointed by President Jimmy Carter, Volcker, previously head of the New York Fed, took over at a time of double-digit inflation and slow growth, known as “stagflation.” To fight inflation, he restricted the supply of money in the economy, pushing interest rates to their highest level in history, topping 20 percent. While the immediate result was a recession and high unemployment, many economists say this “shock therapy” set the stage for the country’s 1980s economic boom. President Ronald Reagan replaced Volcker in 1987 after disagreements over rising U.S. debt, high interest rates, and financial regulation.

Alan Greenspan, 1987–2006. Reagan appointed Greenspan, an economist and former White House advisor, who would go on to serve five terms as Fed chair under four different presidents. A noted inflation hawk and skeptic of government regulation, he was often credited with leading the U.S. economy through its long 1990s expansion. In the wake of the 2008 financial crisis, however, many experts also criticized him for doing little to regulate risky new financial products and allowing a housing bubble to build.

Ben Bernanke, 2006–2014. Appointed by President George W. Bush, Bernanke served two terms that spanned the worst years of the 2008 crisis and its aftermath, known as the Great Recession. His aggressive response included slashing interest rates to zero, supporting financial institutions on the brink of collapse, and pumping trillions of dollars into financial markets to support liquidity and lending. President Barack Obama reappointed Bernanke to a second term, crediting him with avoiding a total economic collapse.

Janet Yellen, 2014–2018. After Bernanke announced his retirement in 2013, Obama chose Yellen, a Yale-trained economist and the first woman to head the U.S. central bank. Before becoming chair, Yellen had issued early warnings about the housing crash and pushed for more aggressive monetary policy to bring down unemployment. During her term, as the United States saw a recovery in the labor market, Yellen oversaw the first rise in interest rates in nearly a decade.

Jerome Powell, 2018–present. New presidents have almost always reappointed the sitting Fed chair to a second term, regardless of party. But after Yellen’s first term expired in February 2018, then-President Donald Trump replaced her with Powell, a businessman, financier, and sitting Fed governor. President Joe Biden reappointed him in 2022. Although Powell had been skeptical of some of the Fed’s regulations, he initially followed Yellen’s blueprint for slowly increasing interest rates. He has since raised interest rates to their highest level in decades as part of the Fed’s response to inflation caused by fallout from the COVID-19 pandemic and the Russian invasion of Ukraine.

How has the Fed’s regulatory role evolved?

The Fed’s regulatory purview steadily expanded through the 1990s. The U.S. banking industry changed dramatically under a 1999 law that legalized the merger of securities, insurance, and banking institutions, and allowed banks to combine retail and investment operations. These two functions had previously been separated under the 1933 Glass-Steagall Act. The changes also made the Fed responsible for ensuring banks’ solvency by enforcing provisions such as minimum capital requirements, consumer protections, antitrust laws, and anti–money laundering policies.

The U.S. financial crisis, which expanded into a global economic crisis beginning in 2008, highlighted the systemic risk embedded in the financial system, and raised questions over the Fed’s oversight. Some economists point to the repeal of Glass-Steagall in particular as the starting gun for a “race to the bottom” among financial regulators, which allowed “too-big-to-fail” institutions to take on dangerous levels of risk. As many assets became “toxic,” especially new types of securities based on risky housing loans, the federal government was forced to step in with trillions of dollars in bailout money to avert the financial system’s collapse.

In the aftermath, debate has continued over how both regulatory changes and monetary policy created the conditions for the crisis. In addition to the Glass-Steagall repeal, regulators in the early 2000s also allowed banks to take on unprecedented levels of debt. Bernanke has blamed excessive debt, lax government regulation, and gaps in oversight of too-big-to-fail banks for the disaster.

In addition, some critics blame the Fed’s long-running policy of low interest rates for contributing to the crisis. Many economists judge Fed policy by the so-called Taylor rule, formulated by Stanford economist John Taylor, which prescribes raising interest rates when inflation runs above target or the economy is operating above its potential. Taylor and others have argued that then-Fed Chair Greenspan’s decision to keep rates low during a period of economic growth helped create the housing bubble by making home loans extremely cheap and encouraging many borrowers to go into debt beyond their means. Greenspan attributed this policy to his belief that the U.S. economy faced the risk of deflation, or a decline in prices, due to a tightening supply of credit.
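The textbook version of the rule fits in a few lines of code. This minimal sketch uses the coefficients and the 2 percent equilibrium real rate from Taylor’s original 1993 formulation; the sample inputs are hypothetical, and the Fed does not mechanically follow any such formula.

```python
# Textbook Taylor rule (Taylor, 1993):
#   i = r* + pi + 0.5*(pi - pi_target) + 0.5*output_gap
# where r* is the assumed equilibrium real interest rate (2% in Taylor's paper),
# pi is current inflation, and output_gap is percent deviation from potential GDP.

def taylor_rule_rate(inflation: float,
                     output_gap: float,
                     inflation_target: float = 2.0,
                     neutral_real_rate: float = 2.0) -> float:
    """Return the prescribed nominal policy rate; all inputs in percent."""
    return (neutral_real_rate
            + inflation
            + 0.5 * (inflation - inflation_target)
            + 0.5 * output_gap)

# Hypothetical example: 4% inflation with output 1% above potential prescribes
# a policy rate of 7.5%, far above the low rates held in the mid-2000s.
print(taylor_rule_rate(inflation=4.0, output_gap=1.0))  # -> 7.5
```

Critics’ point follows directly from the arithmetic: whenever inflation or the output gap rises, the rule prescribes a more-than-proportional rate increase, so holding rates low through a boom amounts to a deliberate deviation from it.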

How did the Fed deal with the Great Recession?

Like other central banks around the world, the Fed immediately slashed interest rates to boost lending and other economic activity. By the end of 2008, it had dropped rates to near zero, where they would stay until 2015. Unlike some other central banks, including the European Central Bank, the Fed decided against negative interest rates. It judged that such a move, essentially charging banks for holding their funds with the Fed in order to spur them to lend, was unlikely to have much effect.

However, the Fed did pursue another unorthodox policy, known as quantitative easing, or QE, which refers to the large-scale purchase of assets, including Treasury bonds, mortgage-backed securities, and other debt. Between 2008 and 2014, the Fed’s balance sheet ballooned from about $900 billion to over $4.5 trillion as the central bank launched several rounds of asset buying.

The goal of QE was to further spur lending when all other monetary policy tools had been maxed out. This was thought to work in multiple ways: by taking bad assets off of banks’ balance sheets, by dramatically increasing the supply of money to be lent, and by signaling to banks and investors that the Fed was committed to taking any steps necessary to restore growth.

The move was not without its critics, as some economists feared such an increase in the money supply would cause out-of-control inflation. Many also argued that additional monetary easing would do little at a time of low demand in the economy.

Economists still debate the results of QE in the wake of the recession. Fed officials and others say it helped stabilize the economy, increase lending, and boost employment. Other experts call the policy disappointing, noting the historically slow U.S. recovery and suggesting that it set the stage for postpandemic inflationary conditions. Fears also remain that winding down, or “tapering,” the Fed’s asset purchases has contributed to market instability—leading to several so-called “taper tantrums.”

After 2014, with U.S. growth rebounding and unemployment falling, the Fed sought to return to normalcy. QE purchases ended in 2014, though the Fed did not move to start gradually shrinking its balance sheet until 2017. The Fed also began slowly raising interest rates starting in December 2015, the first increase since 2006.

However, these efforts were interrupted in 2019, as the Fed became worried about slowing global growth and rising trade tensions. In July 2019, Powell announced he was cutting interest rates, which had reached 2.5 percent, and several more cuts followed that year. At the same time, the Fed again started buying assets, at a pace of $60 billion per month, in an attempt to calm volatile financial markets. The pandemic led to an acceleration in purchases as the Fed sought to contain an economic crisis; the bank’s balance sheet doubled between 2020 and 2022, reaching nearly $9 trillion.

What did Dodd-Frank do?

In the wake of the financial crisis, Congress passed a new set of regulations, the 2010 Dodd-Frank Wall Street Reform and Consumer Protection Act. The legislation seeks to reduce systemic risk through a wide range of policies, including new limits on derivatives trading, stricter oversight of banks, and greater consumer protections. A major plank is the so-called Volcker Rule, named after the former Fed chair, which prohibits federally backed banks from proprietary trading, or making risky bets with their depositors’ funds.

Dodd-Frank introduced what is essentially a third official mandate for the Fed, alongside its inflation and employment targets, by expanding its oversight of the financial system. It does that in part via the Fed’s participation in the newly created Financial Stability Oversight Council, which identifies risks to the system and imposes new regulations as needed.

The Fed is also now in charge of keeping a closer eye on banks’ solvency, so it can ensure they have enough reserves to survive another major downturn. All financial firms big enough to pose a risk to the broader economy, known as “systemically important financial institutions,” are evaluated yearly with so-called “stress tests” that simulate the conditions of an economic crisis. Together, these policies represent a consolidation of oversight in Washington; previously, the regional reserve banks, and the New York Fed in particular, took the lead on regulating banks in their territory. In 2018, Trump signed legislation weakening the Volcker Rule, reducing the number of banks subject to stress tests, and rolling back other Dodd-Frank provisions. Some economists argue that these rollbacks helped create the conditions for the 2023 collapse of Silicon Valley Bank, the third-largest bank failure in U.S. history.

How has the Fed responded to postpandemic inflation?

Beginning in early 2020, the pandemic emerged as a major global economic disruption. That March, the Fed responded with an immediate return to its emergency footing, cutting rates to zero and announcing a slew of measures to bolster markets and pump money into the financial system. With an aggressive Fed response and unprecedented fiscal stimulus, the economic recovery was much faster than that following the Great Recession. However, the stimulus, in combination with demand that had been pent up during lockdowns, snarled supply chains, and high global energy prices following the Russian invasion of Ukraine, contributed to the highest inflation rate since the early 1980s. This inflationary environment has led the Fed to use interest rate hikes to try to cool off the economy while avoiding a recession, a goal known as a “soft landing.”

Fed officials initially took the view that higher inflation was temporary. But as higher prices persisted, the Fed began raising interest rates in 2022 at its fastest clip in forty years, and rates currently sit at a twenty-three-year high. After months of debate among economists over the appropriate time to begin lowering rates, Powell signaled in July 2024 that the bank could soon do so. Some experts contend that waiting unnecessarily heightened the risk of recession. Others, including CFR’s Roger W. Ferguson, a former Fed vice chairman, note that lowering interest rates while inflation remained high could have further entrenched it.

At the same time, the Fed’s actions have reverberated beyond the U.S. economy. Its persistent tightening put pressure on other central banks to raise their interest rates to prevent their currencies from falling further against the surging U.S. dollar, writes CFR’s Brad W. Setser, a former U.S. Treasury official. When Japan—where interest rates were far lower than in the United States—raised rates in July 2024, a series of bets on the low value of the yen relative to the U.S. dollar unraveled, leading to a short-lived collapse in the largest Japanese stock index and an associated drop in major American indices.

CFR’s Sebastian Mallaby dubbed the past decades of low interest rates and low inflation the “age of magic money.” With the opposite now occurring—high rates and above-target inflation—he argues that the Fed could be better served with a higher inflation target than the historical 2 percent. Either way, Mallaby maintains, magic money is now “off the table.”

Recommended Resources

CFR’s Brad W. Setser looks at how inflation has created a perilous moment for central banks.

CFR’s Roger W. Ferguson Jr. explains the Fed’s trade-offs between reducing inflation and stimulating growth.

For Foreign Affairs, CFR’s Sebastian Mallaby describes the end of “magic money.”

This Backgrounder by CFR’s Noah Berman unpacks the Dodd-Frank Act.

Federal Reserve economist Jane E. Ihrig and former Federal Reserve economists Ellen E. Meade and Gretchen C. Weinbach analyze the changing nature [PDF] of the Fed’s policymaking.

Mohammed Aly Sergie, Andrew Chatzky, Megan Fahrney, and Marc Goedemans contributed to this article.

For media inquiries on this topic, please reach out to [email protected].